
    Mesh-Free Hydrodynamic Stability

    A specialized mesh-free radial basis function-based finite difference (RBF-FD) discretization is used to solve the large eigenvalue problems arising in hydrodynamic stability analyses of flows in complex domains. Polyharmonic spline functions with polynomial augmentation (PHS+poly) are used to construct the discrete linearized incompressible and compressible Navier-Stokes operators on scattered nodes. Rigorous global and local eigenvalue stability studies of these global operators and their constituent RBF stencils provide a set of parameters that guarantee stability while balancing accuracy and computational efficiency. Specialized elliptical stencils for computing boundary-normal derivatives are introduced, and the treatment of the pole singularity in cylindrical coordinates is discussed. The numerical framework is demonstrated and validated on a range of hydrodynamic stability methods, from classical linear theory of laminar flows to state-of-the-art non-modal approaches applicable to turbulent mean flows. The examples include linear stability, resolvent, and wavemaker analyses of cylinder flow at Reynolds numbers ranging from 47 to 180, as well as resolvent and wavemaker analyses of the self-similar flat-plate boundary layer and of the turbulent mean of a high-Reynolds-number transonic jet at Mach number 0.9. All previously known results are in close agreement with the literature. Finally, the resolvent-based wavemaker analyses of the Blasius boundary layer and turbulent jet flows offer new physical insight into modal and non-modal growth in these flows.
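
    To make the PHS+poly construction concrete, the sketch below shows how RBF-FD weights for a 2D Laplacian can be assembled on a scattered stencil using a polyharmonic spline kernel r^m with polynomial augmentation. This is a minimal illustration of the general technique, not the paper's implementation; the stencil size, kernel exponent, and polynomial degree are illustrative choices only.

```python
import numpy as np

def rbf_fd_laplacian_weights(nodes, center, m=3, poly_deg=2):
    """RBF-FD weights for the 2D Laplacian at 'center' on a scattered stencil,
    using a polyharmonic spline kernel phi(r) = r**m (odd m) augmented with
    monomials up to total degree 'poly_deg' (the PHS+poly construction)."""
    X = np.asarray(nodes, float) - np.asarray(center, float)  # shift center to origin
    n = len(X)
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    A = r ** m                                                # PHS kernel matrix

    # Monomial basis x**i * y**j up to total degree 'poly_deg' at the shifted nodes.
    exps = [(i, d - i) for d in range(poly_deg + 1) for i in range(d + 1)]
    P = np.column_stack([X[:, 0] ** i * X[:, 1] ** j for i, j in exps])

    # Right-hand side: the Laplacian of each basis function evaluated at the center.
    # In 2D, Laplacian(r**m) = m**2 * r**(m-2); among the monomials only x**2 and
    # y**2 have a nonzero Laplacian (value 2) at the origin.
    rc = np.linalg.norm(X, axis=1)
    b_phi = m ** 2 * rc ** (m - 2)
    b_poly = np.array([2.0 if (i, j) in [(2, 0), (0, 2)] else 0.0 for i, j in exps])

    # Saddle-point system enforcing exactness on the kernel and the polynomials.
    Z = np.zeros((P.shape[1], P.shape[1]))
    K = np.block([[A, P], [P.T, Z]])
    sol = np.linalg.solve(K, np.concatenate([b_phi, b_poly]))
    return sol[:n]

# Quick check: applied to u = x**2 + y**2 the weights should give Laplacian(u) = 4.
rng = np.random.default_rng(1)
nodes = rng.uniform(-0.1, 0.1, size=(15, 2))
w = rbf_fd_laplacian_weights(nodes, center=(0.0, 0.0))
u = nodes[:, 0] ** 2 + nodes[:, 1] ** 2
print(np.dot(w, u))  # ~4 up to roundoff
```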

    High-throughput, Efficient, and Unbiased Capture of Small RNAs from Low-input Samples for Sequencing.

    MicroRNAs hold great promise as biomarkers of disease. However, there are few efficient and robust methods for measuring microRNAs from low-input samples. Here, we develop a high-throughput sequencing protocol that efficiently captures small RNAs while minimizing the biases inherent to library production. The protocol is based on early barcoding, such that all downstream manipulations can be performed on a pool of many samples, thereby reducing reagent usage and workload. We show that optimizing adapter concentrations, together with the addition of nucleotide modifications and random nucleotides, increases the efficiency of small RNA capture. Using unique molecular identifiers, we further show that stochastic capture of low-input RNA, rather than PCR amplification, underlies the biased quantitation of intermediately and lowly expressed microRNAs. Our improved method allows tens to hundreds of samples to be processed simultaneously while retaining high-efficiency quantitation of microRNAs in low-input samples from tissues or bodily fluids.
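
    As an illustration of the role of the unique molecular identifiers mentioned above, the sketch below shows how UMI collapsing separates independently captured molecules from PCR duplicates. The record format (sample barcode, miRNA ID, UMI) is hypothetical and not the protocol's actual file layout.

```python
from collections import defaultdict

def count_umis(reads):
    """reads: iterable of (sample_barcode, mirna_id, umi) tuples after demultiplexing.
    Returns raw read counts and UMI-collapsed molecule counts per (sample, miRNA)."""
    read_counts = defaultdict(int)
    umi_sets = defaultdict(set)
    for sample, mirna, umi in reads:
        read_counts[(sample, mirna)] += 1          # every read counts here
        umi_sets[(sample, mirna)].add(umi)         # duplicates share a UMI
    umi_counts = {key: len(umis) for key, umis in umi_sets.items()}
    return dict(read_counts), umi_counts

reads = [
    ("S1", "miR-21", "ACGT"), ("S1", "miR-21", "ACGT"),  # PCR duplicates: one molecule
    ("S1", "miR-21", "GGTA"),                            # a second captured molecule
    ("S2", "let-7a", "TTAC"),
]
raw, dedup = count_umis(reads)
print(raw[("S1", "miR-21")], dedup[("S1", "miR-21")])    # 3 reads, 2 unique molecules
```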

    CONSTITUTIONAL AND INSTITUTIONAL STRUCTURAL DETERMINANTS OF POLICY RESPONSIVENESS TO PROTECT CITIZENS FROM EXISTENTIAL THREATS: COVID-19 AND BEYOND

    A multitude of government forms and institutional variations share the aim of serving their countries and citizens, yet vary in outcomes. What it means to best serve citizens is, however, a matter of broad interpretation, and so disagreements persist. The ongoing COVID-19 pandemic creates new metrics for comparing government performance: the metric of human deaths or, as we pursue here, the speed of government response in preventing deaths through policy adoption. We argue in this essay that institutional and government systems with more authority redundancies are more likely to generate policy rapidly in response to crisis and to find better policy solutions than centralized systems with minimal authority redundancies. This is due to a multiplicity of access points to policymaking, which increases the chances that some policymaker crafts the "correct" response to a crisis, a response that can then be replicated elsewhere. Furthermore, citizens in centralized and unitary governments must rely on national policymakers to arrive at the correct response, since subnational policymakers are highly constrained compared to their counterparts in decentralized systems. Because policy authority is institutionally defined, these policy authority redundancies correspond to specific institutional and constitutional forms. We provide a formal model in which we analyze the contrast in the speed of policy response between more centralized, autocratic states and democratic federations.
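
    As a minimal, hypothetical illustration of the redundancy argument (not the essay's actual formal model): if each of n policymaking venues independently hits on the correct policy in a given period with probability p, then

```latex
\Pr(\text{at least one correct response per period}) = 1 - (1-p)^n,
\qquad
\mathbb{E}[\text{periods until the first correct response}] = \frac{1}{1-(1-p)^n}
```

    so the expected delay falls monotonically as the number of venues n grows, while a unitary system corresponds to n = 1 with expected delay 1/p.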

    Generalized Contour Dynamics: A Review

    Contour dynamics is a computational technique for solving for the motion of vortices in incompressible inviscid flow. It is a Lagrangian technique in which the motion of contours is followed, and the velocity field that moves the contours can be computed as integrals along the contours. Its best-known examples are in two dimensions, for which the vorticity between contours is taken to be constant and the vortices are vortex patches, and in axisymmetric flow, for which the vorticity varies linearly with distance from the axis of symmetry. This review discusses generalizations that incorporate additional physics, in particular buoyancy effects and magnetic fields, that take specific forms inside the vortices and preserve the contour dynamics structure. The extra physics can lead to time-dependent vortex sheets on the boundaries, whose evolution must be computed as part of the problem. The non-Boussinesq case, in which density differences can be important, leads to a coupled system for the evolution of both the mean interfacial velocity and the vortex sheet strength. Helical geometry is also discussed, in which two quantities are materially conserved and their evolution governs the flow.
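
    For reference, here is a small sketch of the classical two-dimensional contour dynamics evaluation (not the generalized formulations reviewed above): the velocity induced by a uniform-vorticity patch is obtained from a logarithmic line integral around its boundary, discretized here with a simple midpoint rule.

```python
import numpy as np

def patch_velocity(x_eval, contour, omega):
    """Velocity induced at x_eval by a patch of uniform vorticity 'omega' bounded by
    the closed, counterclockwise polygon 'contour' (N x 2 array), via the 2D
    contour-dynamics identity u(x) = -(omega / 2 pi) * \oint ln|x - x'| dx'
    with dx' the vector line element, evaluated by a midpoint rule per segment."""
    pts = np.asarray(contour, dtype=float)
    nxt = np.roll(pts, -1, axis=0)           # next node along the closed contour
    mid = 0.5 * (pts + nxt)                  # segment midpoints
    dx = nxt - pts                           # vector line elements dx'
    r = np.linalg.norm(x_eval - mid, axis=1)
    kernel = np.log(np.maximum(r, 1e-12))    # guard against log(0)
    return -(omega / (2.0 * np.pi)) * (kernel[:, None] * dx).sum(axis=0)

# Example: velocity outside a circular (Rankine) vortex patch of unit radius.
theta = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
circle = np.column_stack([np.cos(theta), np.sin(theta)])
print(patch_velocity(np.array([2.0, 0.0]), circle, omega=1.0))
# Exact swirl speed at r = 2 is Gamma / (2*pi*r) = pi / (4*pi) = 0.25, in +y.
```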

    CoNIC Challenge: Pushing the Frontiers of Nuclear Detection, Segmentation, Classification and Counting

    Nuclear detection, segmentation and morphometric profiling are essential to further our understanding of the relationship between histology and patient outcome. To drive innovation in this area, we set up a community-wide challenge using the largest available dataset of its kind to assess nuclear segmentation and cellular composition. Our challenge, named CoNIC, stimulated the development of reproducible algorithms for cellular recognition with real-time result inspection on public leaderboards. We conducted an extensive post-challenge analysis based on the top-performing models using 1,658 whole-slide images of colon tissue. With around 700 million detected nuclei per model, the associated features were used for dysplasia grading and survival analysis, where we demonstrated that the challenge's improvement over the previous state of the art led to significant boosts in downstream performance. Our findings also suggest that eosinophils and neutrophils play an important role in the tumour microenvironment. We release challenge models and WSI-level results to foster the development of further methods for biomarker discovery.

    Robust estimation of bacterial cell count from optical density

    Optical density (OD) is widely used to estimate the density of cells in liquid culture, but it cannot be compared between instruments without a standardized calibration protocol and is challenging to relate to actual cell count. We address this with an interlaboratory study comparing three simple, low-cost, and highly accessible OD calibration protocols across 244 laboratories, applied to eight strains of constitutively GFP-expressing E. coli. Based on our results, we recommend calibrating OD to estimated cell count using serial dilution of silica microspheres, which produces highly precise calibration (95.5% of residuals <1.2-fold), is easily assessed for quality control, and also determines an instrument's effective linear range. The protocol can further be combined with fluorescence calibration to obtain units of Molecules of Equivalent Fluorescein (MEFL) per cell, allowing direct comparison and data fusion with flow cytometry measurements: in our study, fluorescence-per-cell measurements showed only a 1.07-fold mean difference between plate reader and flow cytometry data.
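
    The sketch below illustrates the idea behind the recommended microsphere calibration: a serial dilution with known particle counts yields a particles-per-OD conversion factor that can then be applied to sample readings. The numbers are synthetic and the simple proportional fit is an illustrative stand-in, not the published protocol.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-fold serial dilution of microspheres with a known starting count (made-up numbers).
n_dilutions = 12
start_particles = 3.0e8
particles = start_particles / 2.0 ** np.arange(n_dilutions)

# Synthetic OD600 readings standing in for plate-reader data (blank ~0.04, slight noise).
true_particles_per_od = 7.0e8
od_blank = 0.04
od_measured = particles / true_particles_per_od + od_blank + rng.normal(0, 0.003, n_dilutions)

# Proportional least-squares fit over the instrument's approximately linear range.
od_net = od_measured - od_blank
linear = (od_net > 0.01) & (od_net < 0.8)
particles_per_od = np.sum(particles[linear] * od_net[linear]) / np.sum(od_net[linear] ** 2)

def od_to_cells(od_sample):
    """Estimate cell count from a raw OD reading, assuming cells scatter roughly
    like the calibrant microspheres (the key calibration assumption)."""
    return particles_per_od * (od_sample - od_blank)

print(f"calibration factor: {particles_per_od:.3e} particles per OD unit")
print(f"OD 0.25 -> about {od_to_cells(0.25):.2e} cells")
```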

    Evaluations of tropospheric aerosol properties simulated by the community earth system model with a sectional aerosol microphysics scheme

    A sectional aerosol model (CARMA) has been developed and coupled with the Community Earth System Model (CESM1). Aerosol microphysics, radiative properties, and interactions with clouds are simulated in the size-resolving model. The model described here uses 20 particle size bins for each aerosol component, including freshly nucleated sulfate particles as well as mixed particles containing sulfate, primary organics, black carbon, dust, and sea salt. The model also includes five types of bulk secondary organic aerosols with four volatility bins. The overall cost of CESM1-CARMA is approximately 2.6 times the computer time of the standard three-mode aerosol model in CESM1 (CESM1-MAM3) and twice that of the seven-mode aerosol model in CESM1 (CESM1-MAM7), using similar gas-phase chemistry codes. Simulated aerosol spatial and temporal distributions are compared with a large set of observations from satellites, ground-based measurements, and airborne field campaigns. Simulated annual average aerosol optical depths are lower than MODIS/MISR satellite observations and AERONET observations by ∼32%; this difference is within the uncertainty of the satellite observations. CESM1-CARMA reproduces sulfate aerosol mass within 8%, organic aerosol mass within 20%, and black carbon aerosol mass within 50% of a multiyear average of the IMPROVE/EPA data over the United States, although differences vary considerably at individual locations. Other data sets show similar levels of agreement with the model simulations. The model suggests that, in addition to sulfate, organic aerosols also contribute significantly to aerosol mass in the tropical upper troposphere and lower stratosphere (UTLS), which is consistent with the limited available data.
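
    As a rough sketch of what a sectional (binned) size representation involves, 20 bins can be defined with a fixed volume ratio between adjacent bins and a mass mixing ratio carried per bin and per component. The bin edges, spacing, and size range below are assumptions for illustration, not the CARMA configuration used in this study.

```python
import numpy as np

# Sectional size grid: 20 bins with a fixed particle-volume ratio between adjacent
# bins, so the radius grows by vrat**(1/3) per bin (all numbers assumed).
n_bins = 20
r_min = 1.0e-9                    # smallest bin radius in meters (assumed)
vrat = 4.0                        # volume ratio between adjacent bins (assumed)
radius = r_min * vrat ** (np.arange(n_bins) / 3.0)
volume = (4.0 / 3.0) * np.pi * radius ** 3

# One mass mixing ratio array per bin and per aerosol component.
components = ["sulfate", "organics", "black_carbon", "dust", "sea_salt"]
mass_bins = {c: np.zeros(n_bins) for c in components}

print(f"bin radii span {radius[0]:.2e} m to {radius[-1]:.2e} m")
```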